Best Nonnegative Rank-One Approximations of Tensors

Authors

Abstract


Similar resources

Nonnegative Approximations of Nonnegative Tensors

We study the decomposition of a nonnegative tensor into a minimal sum of outer products of nonnegative vectors and the associated parsimonious naïve Bayes probabilistic model. We show that the corresponding approximation problem, which is central to nonnegative PARAFAC, will always have optimal solutions. The result holds for any choice of norms and, under a mild assumption, even Bregman diverge...

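The entry above concerns the existence of optimal solutions for nonnegative PARAFAC rather than a specific algorithm. As a companion illustration of the object being approximated, here is a minimal NumPy sketch of a standard alternating update for a nonnegative rank-one approximation of a 3-way nonnegative tensor; the function name nonneg_rank1_approx, the iteration count, and the random initialization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nonneg_rank1_approx(T, n_iter=200, seed=0):
    """Alternating updates for a nonnegative rank-one approximation
    T ~ lam * x o y o z of a 3-way nonnegative tensor (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    # random nonnegative unit vectors as a starting point
    x = rng.random(I); x /= np.linalg.norm(x)
    y = rng.random(J); y /= np.linalg.norm(y)
    z = rng.random(K); z /= np.linalg.norm(z)
    for _ in range(n_iter):
        # each update is the least-squares optimum with the other two factors
        # fixed; it stays nonnegative automatically because T, y, z are nonnegative
        x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x, y); z /= np.linalg.norm(z)
    lam = np.einsum('ijk,i,j,k->', T, x, y, z)
    return lam, x, y, z

# usage on a random nonnegative tensor
T = np.random.rand(4, 5, 6)
lam, x, y, z = nonneg_rank1_approx(T)
approx = lam * np.einsum('i,j,k->ijk', x, y, z)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))
```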


On best rank one approximation of tensors

Today, compact and reduced data representations based on low-rank approximation are commonly used to represent high-dimensional data sets in many application areas, for example genomics, multimedia, quantum chemistry, social networks, or visualization. To produce such low-rank representations, the input data are typically approximated by so-called alternating least squares (ALS) algori...

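The abstract above mentions alternating least squares (ALS) only in passing, so the following is a minimal sketch of what a generic rank-r CP-ALS loop looks like for a 3-way array; it is not the algorithm analyzed in that paper, and the function name cp_als and the fixed iteration count are assumptions made for illustration.

```python
import numpy as np

def cp_als(T, r, n_iter=200, seed=0):
    """Plain ALS sweeps for a rank-r CP approximation
    T ~ sum_s A[:, s] o B[:, s] o C[:, s] of a 3-way array (illustrative)."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, r))
    B = rng.standard_normal((J, r))
    C = rng.standard_normal((K, r))
    for _ in range(n_iter):
        # each factor is the least-squares solution with the other two fixed;
        # the Gram matrices combine via elementwise (Hadamard) products
        A = np.einsum('ijk,jr,kr->ir', T, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', T, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', T, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# usage: rank-3 approximation of a random 5x6x7 array
T = np.random.rand(5, 6, 7)
A, B, C = cp_als(T, r=3)
approx = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))
```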

Complex Tensors Almost Always Have Best Low-rank Approximations

Low-rank tensor approximations are plagued by a well-known problem: a tensor may fail to have a best rank-r approximation. Over R, it is known that such failures can occur with positive probability, sometimes with certainty: in R^{2×2×2}, every tensor of rank 3 fails to have a best rank-2 approximation. We will show that while such failures still occur over C, they happen with zero probability. I...

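The R^{2×2×2} claim quoted above can be checked numerically with the classical textbook example of a rank-3 tensor that is a limit of rank-2 tensors. The construction below is that standard example (not code from the paper): the rank-2 approximation error tends to zero while the rank-one terms blow up, so the infimum is never attained.

```python
import numpy as np

# Classical 2x2x2 example: the rank-3 tensor
#   W = e1 o e1 o e2 + e1 o e2 o e1 + e2 o e1 o e1
# is a limit of rank-2 tensors, so its rank-2 approximation error can be made
# arbitrarily small but is never attained (no best rank-2 approximation).
outer3 = lambda a, b, c: np.einsum('i,j,k->ijk', a, b, c)
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
W = outer3(e1, e1, e2) + outer3(e1, e2, e1) + outer3(e2, e1, e1)

for n in [1, 10, 100, 1000]:
    u = e1 + e2 / n
    # A_n has rank 2 and converges to W, while its rank-one terms grow like n
    A_n = n * outer3(u, u, u) - n * outer3(e1, e1, e1)
    print(n, np.linalg.norm(W - A_n))
```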

Successive Rank-One Approximations of Nearly Orthogonally Decomposable Symmetric Tensors

Many idealized problems in signal processing, machine learning and statistics can be reduced to the problem of finding the symmetric canonical decomposition of an underlying symmetric and orthogonally decomposable (SOD) tensor. Drawing inspiration from the matrix case, the successive rank-one approximations (SROA) scheme has been proposed and shown to yield this tensor decomposition exactly, an...

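As a rough illustration of the SROA scheme mentioned above, the sketch below greedily extracts symmetric rank-one terms with a tensor power iteration and deflates the residual after each step; the function names sym_rank1 and sroa, the iteration counts, and the random initialization are illustrative assumptions, not the exact procedure analyzed in that paper.

```python
import numpy as np

def sym_rank1(T, n_iter=200, seed=0):
    """Symmetric rank-one approximation of a symmetric 3-way tensor via the
    tensor power iteration u <- T(I, u, u) / ||T(I, u, u)|| (illustrative)."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        v = np.einsum('ijk,j,k->i', T, u, u)
        u = v / np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)
    return lam, u

def sroa(T, r):
    """Successive rank-one approximations: extract r symmetric rank-one terms,
    deflating the residual after each extraction."""
    terms, R = [], T.copy()
    for _ in range(r):
        lam, u = sym_rank1(R)
        terms.append((lam, u))
        R = R - lam * np.einsum('i,j,k->ijk', u, u, u)
    return terms

# usage: the weights of an exactly orthogonally decomposable tensor are recovered
Q, _ = np.linalg.qr(np.random.randn(4, 4))
T = sum(w * np.einsum('i,j,k->ijk', q, q, q) for w, q in zip([3.0, 2.0, 1.0], Q.T[:3]))
print(sorted(round(lam, 4) for lam, _ in sroa(T, 3)))
```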


Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2019

ISSN: 0895-4798, 1095-7162

DOI: 10.1137/18m1224064